Discover Top MCP Servers - Improve Your AI Workflows
One-Stop MCP Server & Client Integration - 121,231 Services Listed
Found a total of 54 results related to "Ollama"

Notebook Intelligence
Notebook Intelligence (NBI) is an AI coding assistant and extensible AI framework designed for JupyterLab. It supports models from GitHub Copilot and other LLM providers, including local Ollama models. It significantly enhances productivity through features such as code generation, auto-completion, and chat interfaces, and supports the integration of Model Context Protocol (MCP) services.
Python
23.2K
3 points

Strawhatai Dev
The StrawHat AI development repository provides a complete solution for building local and online AI development environments, including tools such as the Ollama backend, the Open WebUI frontend, and a Claude Desktop proxy, and supports rapid deployment and development of AI applications.
7.7K
2.5 points

Ollama PostgreSQL Data Analysis
An interactive chat assistant that combines local Ollama large language models with PostgreSQL database access, supporting natural language querying of databases and automatic SQL generation; a minimal sketch of this pattern follows this entry.
TypeScript
6.3K
2.5 points
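As a rough illustration of the natural-language-to-SQL pattern this entry describes (not this project's actual code), the sketch below uses the ollama Python client to draft a query and psycopg2 to run it; the model name, connection settings, schema, and prompts are assumptions.

```python
import ollama          # pip install ollama
import psycopg2        # pip install psycopg2-binary

# Hypothetical connection settings and model name -- adjust to your setup.
conn = psycopg2.connect("dbname=shop user=postgres password=postgres host=localhost")
schema_hint = "Table orders(id serial, placed_at timestamp, total numeric)"
question = "How many orders were placed last month?"

# Ask a local Ollama model to translate the question into a single SQL statement.
response = ollama.chat(
    model="llama3.1",
    messages=[
        {"role": "system",
         "content": f"Answer with one PostgreSQL query and nothing else. Schema: {schema_hint}"},
        {"role": "user", "content": question},
    ],
)
sql = response["message"]["content"].strip()

# Run the generated query and show the result.
with conn, conn.cursor() as cur:
    cur.execute(sql)
    print(sql)
    print(cur.fetchall())
```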

MCP Server Ollama Deep Researcher
This project is a deep research server built on local Ollama models. It exposes automated research tools through the MCP protocol and can iteratively search, analyze, and summarize complex topics.
Python
6.8K
2.5 points

Ollama Deep Researcher
A deep research MCP service based on Ollama that performs automated web search and knowledge synthesis using local LLMs.
Python
8.4K
2.5 points

MCP Jpl
This project is an experimental Model Context Protocol (MCP) server for interacting with the Algolia API. It allows users to perform operations such as searching, adding data, and updating index configurations through the MCP protocol. The project provides installation guides, debugging methods, and instructions for integrating with Ollama, but clearly states that it does not provide official support.
Go
6.4K
2.5 points

Deepseek Thinker MCP
The Deepseek Thinker MCP Server is an MCP service that exposes DeepSeek's reasoning (thinking) output, supporting both the OpenAI API and a local Ollama mode, and can be integrated into AI clients.
TypeScript
7.5K
2.5 points

Otter Bridge
OtterBridge is a lightweight MCP server for connecting applications to multiple large language model providers, including Ollama, with a simple and flexible design.
Python
6.7K
2.5 points

Multi Model Advisor
The Multi-Model Advisor is a multi-model advisory server based on Ollama that produces more comprehensive answers by combining the viewpoints of several AI models; a minimal sketch of the pattern appears after this entry.
TypeScript
9.1K
2.5 points
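The "ask several local models, then synthesize" pattern that Multi-Model Advisor describes can be sketched roughly as follows (not the project's own code); the model names, question, and prompts are assumptions.

```python
import ollama  # pip install ollama

# Hypothetical advisory board of local models -- substitute whatever you have pulled.
ADVISORS = ["llama3.1", "mistral", "qwen2.5"]
question = "Should we cache embeddings on disk or recompute them per request?"

# Collect one answer per advisor model.
opinions = []
for model in ADVISORS:
    reply = ollama.chat(model=model, messages=[{"role": "user", "content": question}])
    opinions.append(f"{model}: {reply['message']['content']}")

# Ask one model to merge the individual opinions into a single recommendation.
synthesis = ollama.chat(
    model=ADVISORS[0],
    messages=[{
        "role": "user",
        "content": "Synthesize these opinions into one recommendation:\n\n" + "\n\n".join(opinions),
    }],
)
print(synthesis["message"]["content"])
```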

MCP Server Ollama
A Model Context Protocol server designed to let Claude Desktop communicate with an Ollama LLM server.
Python
9.6K
2.5 points

Ollama Mcpo Adapter
This project is a Python adapter that exposes the tools of MCPO (an MCP-to-OpenAPI proxy server) as Ollama-compatible functions, supporting connection to an existing MCPO instance or starting a local one; a rough sketch of the idea follows this entry.
Python
9.5K
2.5 points
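The general shape of such an adapter might look like the sketch below. This is not the package's actual API; it assumes a running MCPO instance that publishes a standard OpenAPI document at /openapi.json, and the address, model name, and simplified tool schema are assumptions.

```python
import requests  # pip install requests
import ollama    # pip install ollama

MCPO_URL = "http://localhost:8000"  # assumed address of a running MCPO instance

# Assume MCPO publishes an OpenAPI document; turn each POST operation into an
# Ollama-style tool definition (name, description, parameters).
spec = requests.get(f"{MCPO_URL}/openapi.json").json()
tools = []
for path, ops in spec.get("paths", {}).items():
    post = ops.get("post")
    if not post:
        continue
    tools.append({
        "type": "function",
        "function": {
            "name": path.strip("/").replace("/", "_"),
            "description": post.get("summary", path),
            "parameters": {"type": "object", "properties": {}},  # schema omitted in this sketch
        },
    })

# Offer the converted tools to a local Ollama model. Any tool calls it requests
# would then be forwarded to the matching MCPO endpoint with requests.post(...).
response = ollama.chat(
    model="llama3.1",
    messages=[{"role": "user", "content": "What time is it?"}],
    tools=tools,
)
print(response["message"])
```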

Ollama Chat With MCP
Ollama Chat with MCP is a project that demonstrates how to integrate a local language model with real-time web search functionality through the Model Context Protocol (MCP). It includes an MCP web search server, a terminal client, and a Gradio-based web front-end, enabling locally run LLMs to access external tools and data sources.
Python
9.1K
2.5 points

Simple MCP Ollama Bridge
MCP LLM Bridge is a bridge tool that connects Model Context Protocol (MCP) servers with OpenAI-compatible LLM endpoints such as Ollama; a minimal sketch of the OpenAI-compatible side appears after this entry.
Python
6.5K
2.5 points
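The OpenAI-compatible half of such a bridge relies on Ollama's /v1 endpoint. A minimal, self-contained sketch is shown below; the model name and prompt are assumptions, and the MCP plumbing is omitted.

```python
from openai import OpenAI  # pip install openai

# Ollama exposes an OpenAI-compatible API at /v1; the api_key is ignored but required.
client = OpenAI(base_url="http://localhost:11434/v1", api_key="ollama")

completion = client.chat.completions.create(
    model="llama3.1",  # any model you have pulled locally
    messages=[{"role": "user", "content": "Summarize what an MCP server does in one sentence."}],
)
print(completion.choices[0].message.content)
```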

File Context MCP
File Context MCP is a TypeScript-based application that provides an API to query large language models (LLMs) through local file content. It supports multiple LLM providers (Ollama and Together.ai) and can process multiple file types to generate context-aware responses.
TypeScript
9.2K
2.5 points

Ollama MCP
Ollama MCP Server is a protocol server that connects local Ollama LLM models with MCP-compatible applications, providing functions such as model management and conversation interaction.
TypeScript
14.2K
2.5 points

Ollama
A Model Context Protocol server for integrating Ollama with MCP clients such as Claude Desktop
Python
10.1K
2.5 points
Ollama MCP Server
The Ollama MCP Server is a bridge tool that connects Ollama's local large language models with the Model Context Protocol (MCP). It provides complete API integration, model management, and execution functions, and supports OpenAI-compatible chat interfaces and vision-capable multimodal models.
TypeScript
7.7K
2.5 points

Nano Agent
Nano Agent is an experimental, small-scale engineering agent MCP server that supports multi-provider LLM models, used to test and compare the agentic capabilities of cloud and local LLMs in terms of performance, speed, and cost. The project includes a multi-model evaluation system, a nested agent architecture, and a unified tool interface, supporting providers such as OpenAI, Anthropic, and Ollama.
Python
7.2K
2.5 points

Ragdocs
A RAG service based on the Qdrant vector database and Ollama/OpenAI embeddings, providing document semantic search and management functions; a minimal embedding-and-search sketch follows this entry.
TypeScript
9.4K
2.5 points
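As a rough sketch of the embed-then-search loop this entry describes (not Ragdocs' own code), the snippet below pairs Ollama embeddings with a local Qdrant instance; the collection name, embedding model, addresses, and sample documents are assumptions.

```python
import ollama                              # pip install ollama
from qdrant_client import QdrantClient     # pip install qdrant-client
from qdrant_client.models import Distance, VectorParams, PointStruct

client = QdrantClient(url="http://localhost:6333")  # assumed local Qdrant instance
COLLECTION = "docs"

def embed(text: str) -> list[float]:
    # nomic-embed-text is a common local embedding model; substitute your own.
    return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]

# Create the collection once, sized to the embedding dimension.
dim = len(embed("probe"))
client.recreate_collection(
    collection_name=COLLECTION,
    vectors_config=VectorParams(size=dim, distance=Distance.COSINE),
)

# Index a couple of documents.
docs = ["Qdrant stores vectors.", "Ollama serves local models."]
client.upsert(COLLECTION, points=[
    PointStruct(id=i, vector=embed(d), payload={"text": d}) for i, d in enumerate(docs)
])

# Semantic search: embed the query and return the closest documents.
hits = client.search(COLLECTION, query_vector=embed("where are vectors stored?"), limit=2)
for hit in hits:
    print(hit.score, hit.payload["text"])
```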

Ollama MCP Bridge WebUI
A TypeScript bridge project connecting local LLMs to MCP servers, providing a web interface that enables open-source models to use tool capabilities similar to Claude, supporting functions such as file system operations, web search, and complex reasoning.
TypeScript
8.4K
2.5 points

Document MCP
The MCP Document Indexer is a Python-based local document indexing and search server. It uses the LanceDB vector database and a local LLM (via Ollama) to provide real-time monitoring, multi-format document processing, and semantic search, and it exposes these tools to AI assistants such as Claude through the Model Context Protocol (MCP); a sketch of exposing such a search tool over MCP follows this entry.
Python
6.2K
2.5 points
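A minimal sketch of serving a LanceDB-backed semantic search tool over MCP with the Python SDK's FastMCP helper is shown below. This is not the indexer's actual code; the server name, table name, embedding model, and database path are assumptions.

```python
import lancedb                            # pip install lancedb
import ollama                             # pip install ollama
from mcp.server.fastmcp import FastMCP    # pip install mcp

mcp = FastMCP("document-indexer")
db = lancedb.connect("./index.lancedb")   # assumed local database path

def embed(text: str) -> list[float]:
    return ollama.embeddings(model="nomic-embed-text", prompt=text)["embedding"]

@mcp.tool()
def search_documents(query: str, limit: int = 5) -> list[dict]:
    """Return the indexed document chunks most similar to the query."""
    table = db.open_table("chunks")        # assumed table populated by the indexer
    return table.search(embed(query)).limit(limit).to_list()

if __name__ == "__main__":
    mcp.run()   # serves the tool over stdio for MCP clients such as Claude Desktop
```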

Homelab MCP
Homelab MCP is a collection of Model Context Protocol servers used to manage and monitor home lab infrastructure through Claude Desktop. It supports multiple services such as Docker, Ollama, Pi-hole, Unifi, and UPS, and offers both unified and independent deployment modes.
Python
6.4K
2.5 points

MCP Ollama Agent
This project integrates a TypeScript MCP agent with Ollama. Through a unified interface, AI models can call multiple tools, supporting functions such as file operations and web search.
TypeScript
8.1K
2.5 points

Ollama MCP
The Ollama MCP Server is a powerful bridge that seamlessly integrates Ollama with the Model Context Protocol (MCP), enabling Ollama's local LLM capabilities to be easily incorporated into MCP-driven applications.
TypeScript
8.5K
2.5 points

Multi Model Advisor (Ollama)
The Multi-Model Advisor is a multi-model advisory system based on Ollama. It provides more comprehensive answers by combining different perspectives from multiple AI models, using an 'advisory board' approach in which Claude synthesizes multiple AI perspectives into its response.
TypeScript
9.0K
2.5 points
Rag Server MCP
The MCP RAG Server is a retrieval-augmented generation service based on the Model Context Protocol. It automatically indexes project documents through local tools (ChromaDB and Ollama) and provides context enhancement capabilities for connected LLMs.
TypeScript
6.0K
2.5 points

Local MCP Client
The Local MCP Client is a cross-platform web and API interface tool that interacts with configurable MCP servers through natural language, supports local LLM models and Ollama, and enables structured tool execution and dynamic agent behavior.
Python
7.9K
2.5 points

Goose With MCP Servers
A Docker image project for Goose integrated with MCP servers, supporting connection to LLM models through Ollama and adding the GitHub MCP service via command-line extensions.
7.2K
2.5 points

Ollama MCP Server
The Ollama-MCP-server is a protocol server that connects local Ollama LLM instances with MCP-compatible applications, providing functions such as task decomposition, result evaluation, and model management, and supporting standardized communication and performance optimization.
Python
9.4K
2.5 points

Tome
Tome is a macOS application (Windows and Linux support is coming soon) developed by the Runebook team. It aims to simplify using local large language models (LLMs) with MCP servers: by integrating Ollama and managing MCP servers for you, it lets you start chatting with MCP-powered models quickly, without dealing with complex configuration.
TypeScript
9.1K
2.5 points

Unity Ollama
The Unity MCP and Ollama integration package enables seamless communication between Unity and local large language models, supporting asset management and editor automation.
Python
9.4K
2.5 points

Semantic Context MCP
A semantic code search server based on the MCP protocol, supporting both OpenAI and Ollama embedding models, capable of indexing local projects or Git repositories, and providing an enterprise-level private code search solution.
TypeScript
4.6K
2.5 points

Newaitees Ollama MCP Server
Ollama-MCP-server is a middleware server that connects the local Ollama large language model. It provides task decomposition, result evaluation, and model management functions through the Model Context Protocol, supporting standardized communication and performance optimization.
Python
8.0K
2 points

Lancedb
A Node.js-based vector search project that uses the LanceDB database and an Ollama embedding model to implement document similarity search.
JavaScript
8.2K
2 points

MCP Ollama Beeai
A lightweight client application based on a local Ollama model that integrates multiple MCP tools through the BeeAI framework, providing a chat interface and database operation functions.
JavaScript
8.6K
2 points

Lancedb Vector Search
Node.js vector search implementation based on LanceDB and Ollama
JavaScript
8.4K
2 points

Streamlit As An MCP Host
An MCP server project based on the Ollama LLM model, providing Wikipedia article retrieval and summary generation functions, including a command-line client and a Streamlit web interface.
Python
7.7K
2 points
Code Audit MCP
An AI code audit server based on the local Ollama model. Through the integration of the Model Context Protocol (MCP), it provides multi-dimensional code analysis, including security, integrity, performance, quality, architecture, testing, and documentation checks.
TypeScript
9.2K
2 points
MCP RAG Server (srm)
mcp-rag-server is a Retrieval Augmented Generation (RAG) server based on the Model Context Protocol (MCP). It provides relevant context for connected LLMs by indexing project documents. It uses ChromaDB and Ollama for local storage and embedding generation, supports multiple file formats, and can be quickly deployed using Docker.
TypeScript
8.1K
2 points

Multi Agent Research POC
A local multi-agent research system based on Ollama and BraveSearch, supporting tool calls and collaboration, suitable for AI research tasks.
Python
6.1K
2 points